Multidimensional Chebyshev's inequality

From HandWiki

In probability theory, the multidimensional Chebyshev's inequality is a generalization of Chebyshev's inequality to random vectors: it bounds the probability that a random vector deviates from its expected value, in the metric defined by its covariance matrix, by more than a specified amount.

Let $X$ be an $N$-dimensional random vector with expected value $\mu = \operatorname{E}[X]$ and covariance matrix

$$ V = \operatorname{E}\left[(X - \mu)(X - \mu)^{T}\right]. $$

If $V$ is a positive-definite matrix, then for any real number $t > 0$:

$$ \Pr\left( \sqrt{(X - \mu)^{T} V^{-1} (X - \mu)} > t \right) \le \frac{N}{t^2}. $$
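As a quick numerical illustration (not part of the original article), the bound can be checked by Monte Carlo simulation. The sketch below uses NumPy; the Gaussian distribution, the dimension $N = 3$, the threshold $t = 2$, and the particular covariance matrix are all arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3                                 # dimension of the random vector
t = 2.0                               # threshold in the inequality
mu = np.array([1.0, -2.0, 0.5])       # arbitrary mean vector

# Build an arbitrary positive-definite covariance matrix V = A A^T + N I.
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)

# Draw many samples of X and compute the quadratic form (X-mu)^T V^{-1} (X-mu).
samples = rng.multivariate_normal(mu, V, size=200_000)
d = samples - mu
y = np.einsum('ij,jk,ik->i', d, np.linalg.inv(V), d)

# Empirical frequency of the event sqrt(y) > t versus the Chebyshev bound N/t^2.
empirical = np.mean(np.sqrt(y) > t)
bound = N / t**2
print(empirical, bound)   # the empirical frequency stays below the bound 0.75
```

For a Gaussian $X$ the quadratic form is chi-squared with $N$ degrees of freedom, so the empirical frequency is well below the (distribution-free) Chebyshev bound $N/t^2 = 0.75$.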

Proof

Since $V$ is positive-definite, so is $V^{-1}$. Define the random variable

$$ y = (X - \mu)^{T} V^{-1} (X - \mu). $$

Since $y$ is nonnegative, Markov's inequality applies:

$$ \Pr\left( \sqrt{(X - \mu)^{T} V^{-1} (X - \mu)} > t \right) = \Pr\left( \sqrt{y} > t \right) = \Pr\left( y > t^2 \right) \le \frac{\operatorname{E}[y]}{t^2}. $$

Finally,

$$ \operatorname{E}[y] = \operatorname{E}\left[(X - \mu)^{T} V^{-1} (X - \mu)\right] = \operatorname{E}\left[\operatorname{trace}\left(V^{-1} (X - \mu)(X - \mu)^{T}\right)\right] = \operatorname{trace}\left(V^{-1} V\right) = N, $$

where the second equality uses the fact that a scalar equals its own trace together with the cyclic property of the trace, and the third uses the linearity of expectation and the definition of $V$. Substituting $\operatorname{E}[y] = N$ into the Markov bound completes the proof.
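The identity $\operatorname{E}[y] = N$ can also be observed numerically (again an illustration, not part of the original article). The sketch below, with an arbitrary positive-definite covariance and $N = 4$, estimates $\operatorname{E}[y]$ by averaging the quadratic form over many samples:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4                                 # dimension, chosen arbitrarily
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)           # arbitrary positive-definite covariance
mu = np.zeros(N)

# Sample X and evaluate y = (X-mu)^T V^{-1} (X-mu) for each draw.
samples = rng.multivariate_normal(mu, V, size=500_000)
y = np.einsum('ij,jk,ik->i', samples, np.linalg.inv(V), samples)

# trace(V^{-1} V) = trace(I_N) = N, so the sample mean of y should be close to N.
print(y.mean())   # close to 4
```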